Block-Term Tensor Decomposition: Model Selection and Computation
Authors
Abstract
The so-called block-term decomposition (BTD) tensor model has recently been receiving increasing attention, due to its enhanced ability to represent systems and signals that are composed of blocks of rank higher than one, a scenario encountered in numerous and diverse applications. Its uniqueness and approximation have thus been thoroughly studied. Nevertheless, the challenging problem of estimating the BTD model structure, namely the number of block terms and their individual ranks, has only recently started to attract significant attention. In this paper, a novel method for BTD model selection and computation is proposed, based on the idea of imposing column sparsity jointly on the factors in a hierarchical manner and estimating the ranks as the numbers of factor columns of non-negligible magnitude. Following a block successive upper bound minimization (BSUM) approach, the solution of the proposed optimization problem is shown to result in an alternating hierarchical iteratively reweighted least squares (HIRLS) algorithm, which is fast converging and enjoys high computational efficiency, as its iterations rely on small-sized sub-problems with closed-form solutions. Simulation results on both synthetic examples and a hyper-spectral image denoising application are reported, which demonstrate the superiority of the proposed scheme over the state-of-the-art in terms of success rate in rank estimation as well as time to convergence, while attaining comparable approximation performance.
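To make the rank-estimation idea concrete, here is a minimal, simplified sketch (not the paper's exact HIRLS algorithm) of a column-sparsity-promoting iteratively reweighted least squares update for a single factor in a least-squares fit X ≈ A Mᵀ: each sweep solves a small closed-form sub-problem, the reweighting drives negligible columns of A toward zero, and the number of surviving columns serves as the rank estimate. The function name, parameter values (lam, eta, tol), and toy dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def irls_column_sparse_factor(X, M, lam=1e-1, eta=1e-8, n_iter=50, tol=1e-6):
    """Minimize ||X - A M^T||_F^2 + lam * sum_l w_l * ||a_l||_2^2 over A,
    re-estimating the weights w_l = 1 / sqrt(||a_l||_2^2 + eta^2) after each
    closed-form update (a reweighted surrogate for the column-wise l2,1 norm)."""
    A = np.linalg.lstsq(M, X.T, rcond=None)[0].T   # unregularized initialization
    G = M.T @ M                                    # L x L Gram matrix
    XM = X @ M
    for _ in range(n_iter):
        col_norms = np.linalg.norm(A, axis=0)
        w = 1.0 / np.sqrt(col_norms**2 + eta**2)   # column reweighting
        A_new = XM @ np.linalg.inv(G + lam * np.diag(w))  # closed-form LS update
        if np.linalg.norm(A_new - A) <= tol * np.linalg.norm(A):
            return A_new
        A = A_new
    return A

# Toy usage: the number of columns of A with non-negligible norm acts as
# the rank estimate for this factor.
rng = np.random.default_rng(0)
A_true = rng.standard_normal((30, 3))              # true column rank: 3
M = rng.standard_normal((40, 6))                   # over-parameterized with L = 6
X = A_true @ M[:, :3].T + 0.01 * rng.standard_normal((30, 40))
A_hat = irls_column_sparse_factor(X, M)
print(np.linalg.norm(A_hat, axis=0))               # roughly 3 columns survive
```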
Similar Resources
Block-Decoupling Multivariate Polynomials Using the Tensor Block-Term Decomposition
We present a tensor-based method to decompose a given set of multivariate functions into linear combinations of a set of multivariate functions of linear forms of the input variables. The method proceeds by forming a three-way array (tensor) by stacking Jacobian matrix evaluations of the function behind each other. It is shown that a block-term decomposition of this tensor provides the necessary...
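The tensor-construction step mentioned above can be illustrated with a short sketch: Jacobians of a vector-valued function are evaluated (here by finite differences) at a set of sampling points and stacked behind each other into a three-way array; the decoupling itself would then follow from a block-term decomposition of this tensor, which is not reproduced here. The function f, the sampling scheme, and all sizes below are illustrative assumptions, not taken from the cited work.

```python
import numpy as np

def f(x):
    """Toy vector function R^3 -> R^2 (illustrative choice, not from the paper)."""
    return np.array([np.tanh(x[0] + 2.0 * x[1]), (x[1] - x[2]) ** 3])

def numerical_jacobian(func, x, h=1e-6):
    """Central-difference Jacobian of func at x, of shape n x m."""
    fx = func(x)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        e = np.zeros(x.size)
        e[j] = h
        J[:, j] = (func(x + e) - func(x - e)) / (2.0 * h)
    return J

# Stack the Jacobian evaluations behind each other into an n x m x N tensor;
# a block-term decomposition of T would then yield the block-decoupled form.
rng = np.random.default_rng(1)
points = rng.standard_normal((50, 3))              # N = 50 sampling points
T = np.stack([numerical_jacobian(f, p) for p in points], axis=2)
print(T.shape)                                     # (2, 3, 50)
```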
Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition
Recurrent Neural Networks (RNNs) are powerful sequence modeling tools. However, when dealing with high-dimensional inputs, the training of RNNs becomes computationally expensive due to the large number of model parameters. This hinders RNNs from solving many important computer vision tasks, such as Action Recognition in Videos and Image Captioning. To overcome this problem, we propose a compact a...
Block term decomposition for modelling epileptic seizures
Recordings of neural activity, such as EEG, are an inherent mixture of different ongoing brain processes as well as artefacts and are typically characterised by low signal-to-noise ratio. Moreover, EEG datasets are often inherently multidimensional, comprising information in time, along different channels, subjects, trials, etc. Additional information may be conveyed by expanding the signal int...
Linked PARAFAC/CP Tensor Decomposition and Its Fast Implementation for Multi-block Tensor Analysis
In this paper we propose a new flexible group tensor analysis model called the linked CP tensor decomposition (LCPTD). The LCPTD method can decompose given multiple tensors into common factor matrices, individual factor matrices, and core tensors, simultaneously. We applied the Hierarchical Alternating Least Squares (HALS) algorithm to the LCPTD model; in addition, we impose additional constraints t...
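As a hedged illustration of the HALS building block mentioned above, the sketch below runs plain (uncoupled) HALS sweeps for a rank-R CP model of a single third-order tensor; the LCPTD linking constraints (common versus individual factor matrices, core tensors) are omitted. Dimensions, rank, and the iteration count are arbitrary choices for the example, not values from the cited work.

```python
import numpy as np

def khatri_rao(C, B):
    """Column-wise Khatri-Rao product: column r equals kron(C[:, r], B[:, r])."""
    return np.stack([np.kron(C[:, r], B[:, r]) for r in range(B.shape[1])], axis=1)

def unfold(X, mode):
    """Mode-n unfolding consistent with the Khatri-Rao ordering used above."""
    return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1), order='F')

def hals_sweep(X, factors):
    """One HALS pass: cycle over the modes, updating each factor column by column."""
    A, B, C = factors
    for mode, others in ((0, (C, B)), (1, (C, A)), (2, (B, A))):
        W = khatri_rao(*others)              # (product of other dims) x R
        G = W.T @ W                          # R x R Gram matrix
        P = unfold(X, mode) @ W              # I_mode x R
        F = factors[mode]                    # updated in place below
        for r in range(F.shape[1]):          # closed-form column updates
            F[:, r] += (P[:, r] - F @ G[:, r]) / max(G[r, r], 1e-12)
    return factors

# Toy usage on an exactly rank-3 tensor.
rng = np.random.default_rng(2)
A0, B0, C0 = (rng.standard_normal((d, 3)) for d in (8, 9, 10))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
factors = [rng.standard_normal((d, 3)) for d in (8, 9, 10)]
for _ in range(200):
    factors = hals_sweep(X, factors)
rel_err = np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', *factors)) / np.linalg.norm(X)
print(f"relative fit error: {rel_err:.2e}")
```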
Coupled rank-(Lm, Ln, ·) block term decomposition by coupled block simultaneous generalized Schur decomposition
Coupled decompositions of multiple tensors are fundamental tools for multi-set data fusion. In this paper, we introduce a coupled version of the rank-(Lm, Ln, ·) block term decomposition (BTD), applicable to joint independent subspace analysis. We propose two algorithms for its computation based on a coupled block simultaneous generalized Schur decomposition scheme. Numerical results are given ...
Journal
Journal title: IEEE Journal of Selected Topics in Signal Processing
Year: 2021
ISSN: 1941-0484, 1932-4553
DOI: https://doi.org/10.1109/jstsp.2021.3051488